459 research outputs found

    Practical targeted learning from large data sets by survey sampling

    We address the practical construction of asymptotic confidence intervals for smooth (i.e., path-wise differentiable), real-valued statistical parameters by targeted learning from independent and identically distributed data in contexts where the sample size is so large that it poses computational challenges. We observe some summary measure of all data and select a sub-sample from the complete data set by Poisson rejective sampling with unequal inclusion probabilities based on the summary measures. Targeted learning is carried out on the easier-to-handle sub-sample. We derive a central limit theorem for the targeted minimum loss estimator (TMLE), which enables the construction of the confidence intervals. The inclusion probabilities can be optimized to reduce the asymptotic variance of the TMLE. We illustrate the procedure with two examples where the parameters of interest are variable importance measures of an exposure (binary or continuous) on an outcome. We also conduct a simulation study and comment on its results.
    Keywords: semiparametric inference; survey sampling; targeted minimum loss estimation (TMLE).
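    As a rough illustration of the sub-sampling step described above, the Python sketch below draws a Poisson rejective sample with inclusion probabilities proportional to a per-observation summary measure. It is a minimal sketch under our own assumptions: the function name, the proportional scaling of the probabilities, and the usage comments are illustrative, and the paper's optimized inclusion probabilities and the subsequent TMLE step are not reproduced here.

        import numpy as np

        def poisson_rejective_sample(summary, n, seed=None):
            """Draw a fixed-size sample by Poisson rejective sampling with
            unequal inclusion probabilities based on a summary measure.
            Illustrative sketch only."""
            rng = np.random.default_rng(seed)
            summary = np.asarray(summary, dtype=float)
            # Unequal inclusion probabilities proportional to the summary,
            # scaled so they sum (approximately) to the target sample size n.
            pi = np.clip(n * summary / summary.sum(), 1e-6, 1.0)
            # Rejective step: redraw independent Bernoulli trials until
            # exactly n units are selected.
            while True:
                keep = rng.random(len(pi)) < pi
                if keep.sum() == n:
                    return np.flatnonzero(keep), pi[keep]

        # Hypothetical usage: select 10,000 observations from a very large
        # data set, then run targeted learning (TMLE) on the sub-sample,
        # accounting for the returned inclusion probabilities.
        # idx, pi_kept = poisson_rejective_sample(summary_measure, 10_000)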

    Distributed data management with the Webdamlog rule language

    Our goal is to enable a Web user to easily specify distributed data management tasks in place, i.e. without centralizing the data at a single provider. Our system is therefore not a replacement for Facebook, or any centralized system, but an alternative that allows users to launch their own peers on their own machines, processing their local personal data and possibly collaborating with Web services. We introduce Webdamlog, a datalog-style language for managing distributed data and knowledge. The language extends datalog in a number of ways, notably with a novel feature, delegation, allowing peers to exchange not only facts but also rules. We present a user study that demonstrates the usability of the language. We describe a Webdamlog engine that extends a distributed datalog engine, namely Bud, with support for delegation and for a number of other novelties of Webdamlog, such as the possibility of having variables denoting peers or relations. We mention novel optimization techniques, notably one based on the provenance of facts and rules. We exhibit experiments that demonstrate that the rich features of Webdamlog can be supported at reasonable cost and that the engine scales to large volumes of data. Finally, we discuss the implementation of a Webdamlog peer system that provides an environment for the engine. In particular, a peer supports wrappers to exchange Webdamlog data with non-Webdamlog peers. We illustrate these peers by presenting a picture management application that we used for demonstration purposes.
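    The central novelty mentioned above, delegation, can be illustrated with a toy sketch: when a rule's body refers to a relation stored at another peer, the peer ships the (residual) rule to that peer instead of pulling the remote data. The Python below is only a schematic rendering of that idea under assumed conventions ($-prefixed variables, Atom/Rule/Peer classes); it is not the Bud-based Webdamlog engine and omits its optimizations and incremental evaluation.

        from dataclasses import dataclass

        @dataclass(frozen=True)
        class Atom:
            relation: str   # e.g. "photos"
            peer: str       # name of the peer hosting the relation
            args: tuple     # constants, or variables written as "$x"

        @dataclass(frozen=True)
        class Rule:
            head: Atom
            body: tuple     # tuple of Atom

        def is_var(t):
            return isinstance(t, str) and t.startswith("$")

        def substitute(atom, env):
            return Atom(atom.relation, atom.peer,
                        tuple(env.get(a, a) for a in atom.args))

        class Peer:
            def __init__(self, name, network):
                self.name, self.network = name, network
                self.facts = set()           # ground Atoms stored locally
                network[name] = self

            def match(self, atom):
                """Yield variable bindings unifying a local body atom with stored facts."""
                for fact in list(self.facts):
                    if fact.relation != atom.relation or len(fact.args) != len(atom.args):
                        continue
                    env, ok = {}, True
                    for a, f in zip(atom.args, fact.args):
                        if is_var(a):
                            if env.setdefault(a, f) != f:
                                ok = False
                        elif a != f:
                            ok = False
                    if ok:
                        yield env

            def install(self, rule):
                first = rule.body[0]
                if first.peer != self.name:
                    # Delegation: exchange a rule, not just facts.
                    self.network[first.peer].install(rule)
                    return
                for env in self.match(first):
                    head = substitute(rule.head, env)
                    rest = tuple(substitute(a, env) for a in rule.body[1:])
                    if rest:
                        self.install(Rule(head, rest))            # may delegate the residual rule
                    else:
                        self.network[head.peer].facts.add(head)   # ship a derived fact

    In this sketch a peer evaluates the body atoms it owns and, as soon as the next atom lives elsewhere, forwards the remaining rule to that peer, which is the essence of delegation.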

    Demonstration of relativistic electron beam focusing by a laser-plasma lens

    Laser-plasma technology promises a drastic reduction of the size of high energy electron accelerators. It could make free electron lasers available to a broad scientific community, and push further the limits of electron accelerators for high energy physics. Furthermore, the unique femtosecond nature of the source makes it a promising tool for the study of ultra-fast phenomena. However, applications are hindered by the lack of a suitable lens to transport this kind of high-current electron beam, mainly due to its divergence. Here we show that this issue can be solved by using a laser-plasma lens, in which the field gradients are five orders of magnitude larger than in conventional optics. We demonstrate a reduction of the divergence by nearly a factor of three, which should allow for efficient coupling of the beam to a conventional beam transport line.

    Dynamic fragmentation of graphite under laser-driven shocks: Identification of four damage regimes

    This study presents the results of a large experimental campaign conducted at the LULI2000 laser facility. Thin targets of a commercial grade of porous graphite were subjected to high-power laser-driven shocks leading to their fragmentation. Many diagnostics were used, such as high-speed time- and space-resolved imaging systems (shadowgraphy and photography), laser velocimetry (PDV and VISAR), debris collection and post-mortem X-ray tomography. They provided the loading levels in the targets, the spall strength of the material, the shape and size of the debris and the localization of the subsurface cracks. Cross-analysis of all the records showed their reliability and gave better insight into the damage phenomena at play in graphite. Four damage regimes, ranked according to their severity and loading level, were thereby identified. This confirms that laser shocks are very complementary to classical impact tests (plates and spheres), since they combine two-dimensional loadings with the possibility of using both in-situ and post-mortem diagnostics. Finally, the campaign should provide a large and consistent data set for developing and adjusting reliable models of shock wave propagation and damage in porous graphite.

    Combined Forward-Backward Asymmetry Measurements in Top-Antitop Quark Production at the Tevatron

    The CDF and D0 experiments at the Fermilab Tevatron have measured the asymmetry between yields of forward- and backward-produced top and antitop quarks based on their rapidity difference and the asymmetry between their decay leptons. These measurements use the full data sets collected in proton-antiproton collisions at a center-of-mass energy of $\sqrt{s} = 1.96$ TeV. We report the results of combinations of the inclusive asymmetries and their differential dependencies on relevant kinematic quantities. The combined inclusive asymmetry is $A_{\mathrm{FB}}^{t\bar{t}} = 0.128 \pm 0.025$. The combined inclusive and differential asymmetries are consistent with recent standard model predictions.
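    For readers unfamiliar with the quantity quoted above, the rapidity-based asymmetry is conventionally defined from the yields of events with positive and negative top-antitop rapidity difference; the formula below is the standard definition rather than one quoted from the abstract:

        $A_{\mathrm{FB}}^{t\bar{t}} = \dfrac{N(\Delta y > 0) - N(\Delta y < 0)}{N(\Delta y > 0) + N(\Delta y < 0)}$, with $\Delta y = y_t - y_{\bar{t}}$.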

    Robust estimation of bacterial cell count from optical density

    Optical density (OD) is widely used to estimate the density of cells in liquid culture, but it cannot be compared between instruments without a standardized calibration protocol and is challenging to relate to actual cell count. We address this with an interlaboratory study comparing three simple, low-cost, and highly accessible OD calibration protocols across 244 laboratories, applied to eight strains of constitutive GFP-expressing E. coli. Based on our results, we recommend calibrating OD to estimated cell count using serial dilution of silica microspheres, which produces highly precise calibration (95.5% of residuals <1.2-fold), is easily assessed for quality control, also assesses the instrument's effective linear range, and can be combined with fluorescence calibration to obtain units of Molecules of Equivalent Fluorescein (MEFL) per cell, allowing direct comparison and data fusion with flow cytometry measurements: in our study, fluorescence per cell measurements showed only a 1.07-fold mean difference between plate reader and flow cytometry data.
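    The microsphere-based recommendation lends itself to a simple computation: fit a single calibration factor from a serial dilution with a known stock concentration, then convert blanked OD readings of samples into estimated particle (cell) counts. The sketch below only illustrates that arithmetic; the function names and numbers are hypothetical, and the published protocol additionally covers blanking, quality control, the instrument's effective linear range, and the MEFL-per-cell fluorescence conversion.

        import numpy as np

        def od_per_particle(dilution_factors, od_readings, stock_particles_per_well):
            """Estimate the calibration factor (OD per particle) from a serial
            dilution of silica microspheres with a known stock concentration.
            Least-squares slope through the origin: OD = k * particles."""
            particles = stock_particles_per_well * np.asarray(dilution_factors, dtype=float)
            od = np.asarray(od_readings, dtype=float)
            return float(np.sum(od * particles) / np.sum(particles ** 2))

        def estimate_cell_count(sample_od, k):
            """Convert a blanked OD reading into an estimated particle (cell) count."""
            return sample_od / k

        # Hypothetical numbers for illustration only.
        # k = od_per_particle([1, 0.5, 0.25, 0.125], [0.40, 0.21, 0.10, 0.05], 3.0e8)
        # cells = estimate_cell_count(0.25, k)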

    Measurement of the top quark forward-backward production asymmetry and the anomalous chromoelectric and chromomagnetic moments in pp collisions at √s = 13 TeV

    The parton-level top quark (t) forward-backward asymmetry and the anomalous chromoelectric ($\hat{d}_t$) and chromomagnetic ($\hat{\mu}_t$) moments have been measured using LHC pp collisions at a center-of-mass energy of 13 TeV, collected in the CMS detector in a data sample corresponding to an integrated luminosity of 35.9 fb$^{-1}$. The linearized variable $A_{\mathrm{FB}}^{(1)}$ is used to approximate the asymmetry. Candidate $t\bar{t}$ events decaying to a muon or electron and jets in final states with low and high Lorentz boosts are selected and reconstructed using a fit of the kinematic distributions of the decay products to those expected for $t\bar{t}$ final states. The values found for the parameters are $A_{\mathrm{FB}}^{(1)} = 0.048^{+0.095}_{-0.087}\,(\mathrm{stat})^{+0.020}_{-0.029}\,(\mathrm{syst})$ and $\hat{\mu}_t = -0.024^{+0.013}_{-0.009}\,(\mathrm{stat})^{+0.016}_{-0.011}\,(\mathrm{syst})$, and a limit is placed on the magnitude $|\hat{d}_t| < 0.03$ at 95% confidence level.

    Search for new particles in events with energetic jets and large missing transverse momentum in proton-proton collisions at √s = 13 TeV

    A search is presented for new particles produced at the LHC in proton-proton collisions at $\sqrt{s} = 13$ TeV, using events with energetic jets and large missing transverse momentum. The analysis is based on a data sample corresponding to an integrated luminosity of 101 fb$^{-1}$, collected in 2017-2018 with the CMS detector. Machine learning techniques are used to define separate categories for events with narrow jets from initial-state radiation and events with large-radius jets consistent with a hadronic decay of a W or Z boson. A statistical combination is made with an earlier search based on a data sample of 36 fb$^{-1}$, collected in 2016. No significant excess of events is observed with respect to the standard model background expectation determined from control samples in data. The results are interpreted in terms of limits on the branching fraction of an invisible decay of the Higgs boson, as well as constraints on simplified models of dark matter, on first-generation scalar leptoquarks decaying to quarks and neutrinos, and on models with large extra dimensions. Several of the new limits, specifically for spin-1 dark matter mediators, pseudoscalar mediators, colored mediators, and leptoquarks, are the most restrictive to date.
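    As a concrete note on the main observable, missing transverse momentum is the magnitude of the negative vector sum of the transverse momenta of all reconstructed objects in an event. The Python sketch below illustrates that definition and a toy monojet-style selection; the thresholds and inputs are illustrative placeholders, not the analysis cuts or the machine-learning categorization used in the search, and here only the visible jets are summed.

        import numpy as np

        def missing_pt(pt, phi):
            """Magnitude of the negative vector sum of the transverse momenta
            of the supplied objects (here: the visible jets only)."""
            px = -np.sum(pt * np.cos(phi))
            py = -np.sum(pt * np.sin(phi))
            return float(np.hypot(px, py))

        def passes_monojet_selection(jet_pt, jet_phi, lead_jet_threshold=100.0, met_threshold=250.0):
            """Toy selection: one energetic jet and large missing transverse momentum.
            Thresholds (in GeV) are illustrative placeholders."""
            met = missing_pt(jet_pt, jet_phi)
            return bool(jet_pt.max() > lead_jet_threshold and met > met_threshold)

        # Example with made-up jets (pt in GeV, phi in radians):
        # jets_pt, jets_phi = np.array([310.0, 45.0]), np.array([0.3, 2.9])
        # passes_monojet_selection(jets_pt, jets_phi)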
    • 
